Reservoir stack machines

Authors

Abstract

Memory-augmented neural networks equip a recurrent network with an explicit memory to support tasks that require information storage without interference over long times. A key motivation for such research is to perform classic computation tasks, such as parsing. However, memory-augmented neural networks are notoriously hard to train, requiring many backpropagation epochs and a lot of data. In this paper, we introduce the reservoir stack machine, a model which can provably recognize all deterministic context-free languages and circumvents the training problem by training only the output layer of a recurrent net and employing auxiliary information during training about the desired interaction with a stack. In our experiments, we validate the reservoir stack machine against deep and shallow networks from the literature on three benchmark tasks for Neural Turing machines and six deterministic context-free languages. Our results show that the reservoir stack machine achieves zero error, even on test sequences longer than the training data, and needs only a few seconds of training time and 100 training sequences.
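For intuition, the recipe the abstract describes (a fixed recurrent network whose output layer alone is trained, using auxiliary labels for the desired stack actions) can be sketched in a few lines of Python. The sketch below is illustrative only and not the authors' exact architecture: the reservoir size, weight scaling, push/pop action set, Dyck-1 training task, and the closed-form ridge-regression fit are all assumptions made for the example.

    import numpy as np

    rng = np.random.default_rng(0)

    # Fixed random reservoir: the recurrent weights below are never trained;
    # only the linear readout V further down is fit (the reservoir-computing idea).
    N_RES, N_IN = 100, 2                      # alphabet {'(': 0, ')': 1}, one-hot input
    W_in = rng.uniform(-0.5, 0.5, (N_RES, N_IN))
    W = rng.uniform(-0.5, 0.5, (N_RES, N_RES))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep spectral radius below 1

    def reservoir_states(seq):
        """Run the fixed reservoir over a symbol sequence; one state per time step."""
        x, states = np.zeros(N_RES), []
        for s in seq:
            x = np.tanh(W_in @ np.eye(N_IN)[s] + W @ x)
            states.append(x.copy())
        return np.array(states)

    def random_dyck1(n_pairs):
        """A random balanced-parentheses string as a list of symbol ids."""
        seq, opens_left, depth = [], n_pairs, 0
        while opens_left > 0 or depth > 0:
            if opens_left > 0 and (depth == 0 or rng.random() < 0.5):
                seq.append(0); opens_left -= 1; depth += 1
            else:
                seq.append(1); depth -= 1
        return seq

    # Auxiliary supervision about the desired stack interaction:
    # for Dyck-1 the desired action is simply push (0) on '(' and pop (1) on ')'.
    train = [random_dyck1(int(rng.integers(1, 10))) for _ in range(100)]
    X = np.vstack([reservoir_states(seq) for seq in train])
    Y = np.concatenate([np.array(seq) for seq in train])
    T = np.eye(2)[Y]                          # one-hot action targets

    # Training = one closed-form ridge-regression solve; no backpropagation.
    V = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N_RES), X.T @ T)

    def accepts(seq):
        """Execute predicted actions on a real stack; accept iff it empties exactly."""
        stack = []
        for action in np.argmax(reservoir_states(seq) @ V, axis=1):
            if action == 0:
                stack.append("(")
            elif not stack:                   # pop requested on an empty stack: reject
                return False
            else:
                stack.pop()
        return len(stack) == 0

    print(accepts(random_dyck1(50)))          # expect True, beyond training lengths
    print(accepts([0, 1, 1, 0]))              # expect False: "())(" is not balanced

Because only the linear readout V is trained, training amounts to a single least-squares solve over 100 short sequences, which is consistent with the abstract's claim of training times in seconds; length generalization comes from executing a real stack rather than approximating one inside the hidden state.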

Similar resources

Global Stack Allocation – Register Allocation for Stack Machines

Register allocation is a critical part of any compiler, yet register allocation for stack machines has received relatively little attention in the past. We present a framework for the analysis of register allocation methods for stack machines which has allowed us to analyse current methods. We have used this framework to design the first truly procedure-wide register allocation methods for stac...

Simulations of Quantum Turing Machines by Quantum Multi-Stack Machines

As is well known, in classical computation, Turing machines, circuits, multi-stack machines, and multi-counter machines are equivalent; that is, they can simulate each other in polynomial time. In quantum computation, Yao [11] first proved that for any quantum Turing machine M, there exists a quantum Boolean circuit (n, t)-simulating M, where n denotes the length of input strings, and t is th...

Extended Macro Grammars and Stack Controlled Machines

K-extended basic macro grammars are introduced, where K is any class of languages. The class B(K) of languages generated by such grammars is investigated, together with the class LB(K) of languages generated by the corresponding linear basic grammars. For any full semiAFL K, B(K) is a full AFL closed under iterated LB(K)-substitution, but not necessarily under substitution. For any machine type...

Learning Operations on a Stack with Neural Turing Machines

Multiple extensions of Recurrent Neural Networks (RNNs) have been proposed recently to address the difficulty of storing information over long time periods. In this paper, we experiment with the capacity of Neural Turing Machines (NTMs) to deal with these long-term dependencies on well-balanced strings of parentheses. We show that not only does the NTM emulate a stack with its heads and learn a...
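For context, the ground truth such experiments test against is the textbook stack algorithm for balanced brackets. A minimal Python sketch follows; it is generalized here to several bracket types, which genuinely requires a stack (for a single parenthesis type, as in the paper's experiments, a counter would suffice):

    def balanced(s: str) -> bool:
        """Return True iff every bracket in s is properly matched and nested."""
        pairs = {")": "(", "]": "[", "}": "{"}
        stack = []
        for c in s:
            if c in "([{":
                stack.append(c)                       # push the opening bracket
            elif not stack or stack.pop() != pairs[c]:
                return False                          # mismatch or pop on empty stack
        return not stack                              # accept only if stack is empty

    print(balanced("([]{})"))  # True
    print(balanced("([)]"))    # False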

Pseudorandomness for Linear Length Branching Programs and Stack Machines

We show the existence of an explicit pseudorandom generator G of linear stretch such that for every constant k, the output of G is pseudorandom against: • Oblivious branching programs over alphabet {0, 1} of length kn and size 2^(O(log n)) on inputs of size n. • Non-oblivious branching programs over alphabet Σ of length kn, provided the size of Σ is a power of 2 and sufficiently large in terms of k....

Journal

Journal title: Neurocomputing

Year: 2022

ISSN: 0925-2312, 1872-8286

DOI: https://doi.org/10.1016/j.neucom.2021.05.106